Bayesian Online Algorithms for Learning in Discrete Hidden Markov Models
Authors
Abstract
We propose and analyze two different Bayesian online algorithms for learning in discrete Hidden Markov Models and compare their performance with the already known Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of generalization, we draw learning curves for these algorithms in simplified situations and compare their performances.

1. Introduction

The unifying perspective of the Bayesian approach to machine learning allows the construction of efficient algorithms and sheds light on the characteristics they should have in order to attain such efficiency. In this paper we construct and characterize the performance of mean-field online algorithms for discrete Hidden Markov Models (HMMs) [5, 9], derived from approximations to a fully Bayesian algorithm. HMMs form a class of graphical models used to model the behavior of time series. They have a wide range of applications, including speech recognition [9], DNA and protein analysis [3, 4], and econometrics [10]. Discrete HMMs are defined by an underlying Markov chain with hidden states q_t, a transition matrix A, and an emission matrix B. If A and B do not depend on time, the HMM is said to be homogeneous; this is a simplifying assumption which is not needed for online learning. The observed states y_t of the HMM represent the observations of the time series, i.e., a time series from time t = 1 to t = T is represented by the observed sequence y_1, ..., y_T.
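The two ingredients named above — a discrete homogeneous HMM given by a transition matrix A and an emission matrix B, and the Kullback-Leibler divergence used as a generalization measure — can be illustrated with a minimal sketch. This is not the paper's learning algorithm; all parameter values below are made-up examples.

```python
# Sketch only: a discrete homogeneous HMM (transition matrix A, emission
# matrix B, as defined in the introduction) and the Kullback-Leibler
# divergence between two discrete distributions. Parameters are illustrative.
import math
import random

def sample_hmm(A, B, pi, T, rng):
    """Draw an observed sequence y_1, ..., y_T from the HMM (A, B, pi)."""
    def draw(probs):
        u, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if u <= acc:
                return i
        return len(probs) - 1
    q = draw(pi)                      # hidden state q_1
    ys = []
    for _ in range(T):
        ys.append(draw(B[q]))         # emit y_t given hidden state q_t
        q = draw(A[q])                # hidden-state transition
    return ys

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

A  = [[0.9, 0.1], [0.2, 0.8]]         # hidden-state transition matrix
B  = [[0.7, 0.3], [0.1, 0.9]]         # emission matrix
pi = [0.5, 0.5]                       # initial hidden-state distribution
ys = sample_hmm(A, B, pi, T=20, rng=random.Random(0))
d  = kl(B[0], [0.5, 0.5])             # how far one emission row is from uniform
```

The KL divergence here compares two fixed emission distributions; in the paper it is applied between the true model and the learned model to trace learning curves.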
Similar papers
Spatial count models on the number of unhealthy days in Tehran
Spatial count data are usually found in most sciences, such as environmental science, meteorology, geology and medicine. Spatial generalized linear models based on the Poisson (Poisson-lognormal spatial model) and binomial (binomial-logitnormal spatial model) distributions are often used to analyze discrete count data in which spatial correlation is observed. The likelihood function of these models i...
An Introduction to Hidden Markov Models and Bayesian Networks
We provide a tutorial on learning and inference in hidden Markov models in the context of the recent literature on Bayesian networks. This perspective makes it possible to consider novel generalizations of hidden Markov models with multiple hidden state variables, multiscale representations, and mixed discrete and continuous variables. Although exact inference in these generalizations is usuall...
Online Learning in Discrete Hidden Markov Models
We present and analyze three different online algorithms for learning in discrete Hidden Markov Models (HMMs) and compare their performance with the Baldi-Chauvin Algorithm. Using the Kullback-Leibler divergence as a measure of the generalization error we draw learning curves in simplified situations and compare the results. The performance for learning drifting concepts of one of the presented...
Approximate Learning in Temporal Hidden Hopfield Models
Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low-dimensionality of discrete variables. For higher dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined...
Bayesian nonparametric hidden semi-Markov models
There is much interest in the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) as a natural Bayesian nonparametric extension of the ubiquitous Hidden Markov Model for learning from sequential and time-series data. However, in many settings the HDP-HMM's strict Markovian constraints are undesirable, particularly if we wish to learn or encode non-geometric state durations. We can exten...
Publication date: 2007